Accelerated Sparse Linear Regression via Random Projection

Authors

  • Weizhong Zhang
  • Lijun Zhang
  • Rong Jin
  • Deng Cai
  • Xiaofei He
Abstract

In this paper, we present an accelerated numerical method based on random projection for sparse linear regression. Previous studies have shown that, under appropriate conditions, gradient-based methods enjoy a geometric convergence rate when applied to this problem. However, the time complexity of evaluating the gradient is as large as O(nd), where n is the number of data points and d is the dimensionality, making those methods inefficient for large-scale, high-dimensional datasets. To address this limitation, we first utilize random projection to find a rank-k approximation of the data matrix, reducing the cost of gradient evaluation to O(nk + dk), a significant improvement when k is much smaller than d and n. Then, we solve the sparse linear regression problem via a proximal gradient method with a homotopy strategy to generate sparse intermediate solutions. Theoretical analysis shows that our method also achieves a global geometric convergence rate, and moreover the sparsity of all the intermediate solutions is well-bounded over the iterations. Finally, we conduct experiments to demonstrate the efficiency of the proposed method.
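To make the two ingredients of the abstract concrete, the following is a minimal NumPy sketch of (i) a rank-k approximation of the data matrix obtained via Gaussian random projection and (ii) proximal gradient (soft-thresholding) updates whose gradients are evaluated through the factorization at O(nk + dk) cost. It is an illustration under assumed names and a fixed regularization parameter, not the paper's implementation; the homotopy strategy of gradually decreasing the regularization is omitted.

import numpy as np

def low_rank_approx(X, k, rng):
    # Rank-k factorization X ~= Q @ B via a Gaussian random projection:
    # sketch the column space of X, orthonormalize, then project X onto it.
    Omega = rng.standard_normal((X.shape[1], k))   # d x k test matrix
    Q, _ = np.linalg.qr(X @ Omega)                 # n x k orthonormal basis
    B = Q.T @ X                                    # k x d, so X_hat = Q @ B
    return Q, B

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sparse_regression_rp(X, y, lam, k=50, n_iter=200, seed=0):
    # Proximal gradient on 0.5 * ||X_hat w - y||^2 + lam * ||w||_1, X_hat = Q @ B.
    rng = np.random.default_rng(seed)
    Q, B = low_rank_approx(X, k, rng)
    w = np.zeros(X.shape[1])
    L = np.linalg.norm(B, 2) ** 2                  # Lipschitz constant of the smooth part
    for _ in range(n_iter):
        r = Q @ (B @ w) - y                        # residual computed in O(nk + dk)
        grad = B.T @ (Q.T @ r)                     # gradient computed in O(nk + dk)
        w = soft_threshold(w - grad / L, lam / L)
    return w

In use, one would typically decrease lam over several such runs while warm-starting w, which is roughly the role the homotopy strategy plays in keeping the intermediate solutions sparse.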


Related articles

Accelerated Projected Gradient Method for Linear Inverse Problems with Sparsity Constraints

Regularization of ill-posed linear inverse problems via l1 penalization has been proposed for cases where the solution is known to be (almost) sparse. One way to obtain the minimizer of such an l1 penalized functional is via an iterative soft-thresholding algorithm. We propose an alternative implementation to l1-constraints, using a gradient method, with projection on l1-balls. The correspondin...
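As a concrete illustration of the projection step mentioned above, here is a small NumPy sketch of Euclidean projection onto an l1-ball of radius tau, using the standard sort-based scheme; the function name is illustrative and this is not the implementation from the cited paper.

import numpy as np

def project_l1_ball(v, tau):
    # Euclidean projection of v onto {x : ||x||_1 <= tau}.
    if np.abs(v).sum() <= tau:
        return v.copy()                                  # already feasible
    u = np.sort(np.abs(v))[::-1]                         # magnitudes, decreasing
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(u) + 1) > css - tau)[0][-1]
    theta = (css[rho] - tau) / (rho + 1)                 # shrinkage threshold
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

A projected gradient iteration would then take the form w = project_l1_ball(w - eta * grad, tau) for a step size eta.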


Estimation in High-Dimensional Linear Models with Deterministic Design Matrices

Because of the advance in technologies, modern statistical studies often encounter linear models with the number of explanatory variables much larger than the sample size. Estimation and variable selection in these high-dimensional problems with deterministic design points is very different from those in the case of random covariates, due to the identifiability of the high-dimensional regressio...


Sampling Requirements and Accelerated Schemes for Sparse Linear Regression with Orthogonal Least-Squares

The Orthogonal Least Squares (OLS) algorithm sequentially selects columns of the coefficient matrix to greedily find an approximate sparse solution to an underdetermined system of linear equations. Previous work on the analysis of OLS has been limited; in particular, there exist no guarantees on the performance of OLS for sparse linear regression from random measurements. In this paper, the pro...
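For illustration, a brute-force NumPy sketch of the greedy selection OLS performs is given below: at each step it adds the column whose inclusion yields the smallest least-squares residual on the current support. The function name, the fixed-sparsity stopping rule, and the naive per-step least-squares solves are assumptions made for readability, not the efficient updates used in practice.

import numpy as np

def ols_greedy(A, y, sparsity):
    # Greedy Orthogonal Least Squares: grow the support one column at a time,
    # always choosing the column that minimizes the residual of the LS fit.
    n, d = A.shape
    support = []
    for _ in range(sparsity):
        best_j, best_err = None, np.inf
        for j in range(d):
            if j in support:
                continue
            cols = support + [j]
            coef, *_ = np.linalg.lstsq(A[:, cols], y, rcond=None)
            err = np.linalg.norm(y - A[:, cols] @ coef)
            if err < best_err:
                best_j, best_err = j, err
        support.append(best_j)
    x = np.zeros(d)
    coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    x[support] = coef
    return x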


Semi-supervised Feature Selection via Rescaled Linear Regression

With the rapid increase of complex and high-dimensional sparse data, demands for new methods to select features by exploiting both labeled and unlabeled data have increased. Least-squares regression based feature selection methods usually learn a projection matrix and evaluate the importance of features using the projection matrix, which lacks theoretical explanation. Moreover, these methods canno...
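The following is a generic sketch of the projection-matrix idea described above: fit a ridge-regularized least-squares mapping W from features to one-hot labels and rank features by the row norms of W. It is a simplified stand-in for intuition, not the rescaled linear regression method of the cited paper; all names and the regularization value are assumptions.

import numpy as np

def feature_scores(X, Y, gamma=1e-2):
    # X: n x d data matrix, Y: n x c one-hot label matrix.
    # Ridge-regularized least-squares projection matrix W (d x c),
    # then score each feature by the l2 norm of its row of W.
    d = X.shape[1]
    W = np.linalg.solve(X.T @ X + gamma * np.eye(d), X.T @ Y)
    return np.linalg.norm(W, axis=1)

def select_features(X, Y, m, gamma=1e-2):
    # Indices of the m highest-scoring features.
    return np.argsort(feature_scores(X, Y, gamma))[::-1][:m]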


Gene Expression Profile Classification in Random Feature Space

In this study, gene expression profile classification is done via sparse representation in the random feature space, which is obtained by either random projection or the nonlinear random mapping used in the extreme learning machine (ELM). The numerical experiment shows that sparse representation has slightly better performance than ELM.
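A rough NumPy sketch of the random-projection variant of this pipeline is shown below: project the expression profiles with a Gaussian random matrix, sparsely code each test sample against the projected training samples (here with plain soft-thresholded gradient steps), and predict the class with the smallest reconstruction residual. Parameter values and names are illustrative assumptions, and y_train is assumed to be a NumPy array of class labels.

import numpy as np

def classify_src_rp(X_train, y_train, X_test, k=100, lam=0.1, n_iter=300, seed=0):
    rng = np.random.default_rng(seed)
    R = rng.standard_normal((k, X_train.shape[1])) / np.sqrt(k)  # random feature map
    D = R @ X_train.T                                            # k x n dictionary
    D = D / np.linalg.norm(D, axis=0, keepdims=True)             # unit-norm atoms
    step = 1.0 / np.linalg.norm(D, 2) ** 2                       # 1 / Lipschitz constant
    classes = np.unique(y_train)
    preds = []
    for x in X_test:
        z, a = R @ x, np.zeros(D.shape[1])
        for _ in range(n_iter):                                  # l1-regularized coding (ISTA)
            a = a - step * (D.T @ (D @ a - z))
            a = np.sign(a) * np.maximum(np.abs(a) - step * lam, 0.0)
        errs = [np.linalg.norm(z - D[:, y_train == c] @ a[y_train == c]) for c in classes]
        preds.append(classes[int(np.argmin(errs))])
    return np.array(preds)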



Publication year: 2016